Nearest Neighbours Graph Variational AutoEncoder

Authors

Abstract

Graphs are versatile structures for the representation of many real-world data. Deep Learning on graphs is currently able to solve a wide range of problems with excellent results. However, both the generation and the handling of large graphs still remain open challenges. This work aims to introduce techniques for generating such graphs and to test the approach on a complex problem such as the calculation of dose distribution in oncological radiotherapy applications. To this end, we introduced a pooling technique (ReNN-Pool) capable of sampling nodes that are spatially uniform, without computational requirements in model training and inference. By construction, ReNN-Pool also allows the definition of a symmetric un-pooling operation to recover the original dimensionality of the graphs. We present a Variational AutoEncoder (VAE) for graphs, based on the defined operations, which employs convolutional graph layers in both the encoding and decoding phases. The performance was tested both on a realistic use case, a cylindrical dataset from a radiotherapy application, and on the standard benchmark sprite dataset. Compared with other techniques, the proposed approach proved to reduce computational requirements.
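The abstract describes ReNN-Pool and its symmetric un-pooling only at a high level. The sketch below is a hypothetical illustration of that general idea, not the authors' implementation: it greedily keeps a node, drops that node's nearest neighbours so the surviving nodes are roughly spatially uniform, and lets the un-pooling step copy features back from each kept node. The function names renn_pool and renn_unpool, the use of scipy's k-d tree, and the neighbourhood size k are all assumptions introduced for illustration.

```python
# Hypothetical sketch of a ReNN-style pooling / un-pooling pair.
# Not the paper's reference implementation; names and the k-NN query
# are assumptions made for this illustration.
import numpy as np
from scipy.spatial import cKDTree


def renn_pool(pos, feats, k=6):
    """Greedily keep a node and drop its k nearest neighbours, so the
    surviving nodes are approximately spatially uniform."""
    tree = cKDTree(pos)
    n = pos.shape[0]
    keep, removed_to_keeper = [], np.full(n, -1, dtype=int)
    alive = np.ones(n, dtype=bool)
    for i in range(n):
        if not alive[i]:
            continue
        keep.append(i)
        # neighbours of a kept node are removed from the pooled graph
        _, nbrs = tree.query(pos[i], k=k + 1)  # +1: node i is its own nearest neighbour
        for j in np.atleast_1d(nbrs):
            if j != i and alive[j]:
                alive[j] = False
                removed_to_keeper[j] = i
        removed_to_keeper[i] = i
    keep = np.array(keep)
    return keep, removed_to_keeper, feats[keep]


def renn_unpool(keep, removed_to_keeper, pooled_feats, n_nodes):
    """Symmetric un-pooling: every removed node receives the features of the
    kept node that caused its removal, restoring the original node count."""
    index_of = {int(node): row for row, node in enumerate(keep)}
    out = np.zeros((n_nodes, pooled_feats.shape[1]))
    for j in range(n_nodes):
        out[j] = pooled_feats[index_of[int(removed_to_keeper[j])]]
    return out


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(size=(200, 3))   # toy 3D point cloud standing in for graph node positions
    feats = rng.normal(size=(200, 8))  # toy node features
    keep, mapping, pooled = renn_pool(pos, feats, k=6)
    restored = renn_unpool(keep, mapping, pooled, n_nodes=200)
    print(len(keep), restored.shape)   # e.g. a few dozen kept nodes, (200, 8)
```

Under these assumptions, chaining renn_pool inside the encoder and renn_unpool inside the decoder would give the dimensionality-preserving round trip that the abstract attributes to the VAE.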


Similar Resources

Junction Tree Variational Autoencoder for Molecular Graph Generation

We seek to automate the design of molecules based on specific chemical properties. In computational terms, this task involves continuous embedding and generation of molecular graphs. Our primary contribution is the direct realization of molecular graphs, a task previously approached by generating linear SMILES strings instead of graphs. Our junction tree variational autoencoder generates molecu...

Full text

Variational Lossy Autoencoder

Representation learning seeks to expose certain aspects of observed data in a learned representation that’s amenable to downstream tasks like classification. For instance, a good representation for 2D images might be one that describes only global structure and discards information about detailed texture. In this paper, we present a simple but principled method to learn such global representati...

Full text

Quantum Variational Autoencoder

Variational autoencoders (VAEs) are powerful generative models with the salient ability to perform inference. Here, we introduce a quantum variational autoencoder (QVAE): a VAE whose latent generative process is implemented as a quantum Boltzmann machine (QBM). We show that our model can be trained end-to-end by maximizing a well-defined loss-function: a “quantum” lower bound to a variational ap...
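For context, the classical variational lower bound that such a “quantum” bound generalizes is the standard VAE objective (textbook material, not taken from the QVAE paper):

\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z\mid x)}\big[\log p_\theta(x\mid z)\big] \;-\; D_{\mathrm{KL}}\big(q_\phi(z\mid x)\,\|\,p(z)\big)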

Full text

Epitomic Variational Autoencoder

In this paper, we propose epitomic variational autoencoder (eVAE), a probabilistic generative model of high dimensional data. eVAE is composed of a number of sparse variational autoencoders called ‘epitome’ such that each epitome partially shares its encoder-decoder architecture with other epitomes in the composition. We show that the proposed model greatly overcomes the common problem in varia...

Full text


Journal

Journal title: Algorithms

Year: 2023

ISSN: 1999-4893

DOI: https://doi.org/10.3390/a16030143